
Grouped-Attention for Content-Selection and Content-Plan Generation

Bayu Distiawan Trisedya, Xiaojie Wang, Jianzhong Qi, Rui Zhang, Qingjun Cui

Findings of the Association for Computational Linguistics: EMNLP 2021 | Association for Computational Linguistics | Published : 2021

Abstract

Recent neural data-to-text generation models employ Pointer Networks to explicitly learn a content-plan given a set of attributes as input. They use an LSTM to encode the input, which assumes a sequential relationship in the input. This may be sub-optimal for encoding a set of attributes, where the attributes have a composite structure: the attributes are unordered while each attribute value is an ordered list of tokens. We handle this problem by proposing a neural content-planner that can capture both local and global contexts of such a structure. Specifically, we propose a novel attention mechanism called GSC-attention. A key component of the GSC-attention is grouped-attention, which is token-level…
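The abstract describes attending over a composite structure: an unordered set of attributes whose values are ordered token lists. The paper's exact GSC-attention formulation is not reproduced here; the following is only a minimal sketch of the two-level idea (token-level attention within each attribute, then attention over attribute summaries), with all function names and shapes being illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def grouped_attention(groups, query):
    """Two-level attention over a set of attributes (illustrative sketch).

    groups: list of (len_i, d) arrays -- token embeddings for each
            attribute value; tokens are ordered within a group, but
            the groups themselves form an unordered set.
    query:  (d,) query vector.
    Returns a (d,) context vector.
    """
    # Local (token-level) attention inside each attribute group.
    summaries = []
    for g in groups:
        w = softmax(g @ query)       # (len_i,) token weights
        summaries.append(w @ g)      # (d,) weighted token summary
    S = np.stack(summaries)          # (num_groups, d)

    # Global attention over the group summaries; a softmax-weighted sum
    # is invariant to the order of the groups, matching the set structure.
    w = softmax(S @ query)           # (num_groups,)
    return w @ S
```

Because the global step is a weighted sum over group summaries, permuting the attribute groups leaves the output unchanged, while token order still matters inside each group's local attention input.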

